Web Survey Bibliography
Abstract #1:
The growing usage of smartphone applications (or “apps”), particularly among young adults, has opened a new frontier for data collection. This emerging method of Computer-Assisted Self-Interviewing (CASI) offers new techniques to engage respondents on the mobile platform in response to the persistent challenge of respondent cooperation. Game mechanics have been integrated with smartphone apps in recent years to draw on users’ intrinsic motivation to engage in a task. Tools such as points, badges, levels, challenges, and leaderboards are used to motivate desired behaviors (i.e., “gamifying” the process without necessarily turning the task into a full “game”). Moreover, “social sharing” on networks such as Facebook is a defining attribute for today’s youth and a critical feature of some of the most successful apps. Social-sharing mechanics such as commenting, posting updates, or “liking” the status of others connect users within the app community and across social networks such as Facebook. Leveraging both game and social mechanics for mobile app research can maximize respondent engagement for longitudinal data collection. To measure these emerging techniques for engagement, Nielsen will conduct a split-sample experiment contrasting two versions of an iPhone app that collects media usage information. One version of the app will be fully integrated with game and social mechanics, while the other will launch without these features and add the game and social mechanics in phases. This experiment is expected to yield insights into the effectiveness of these emerging techniques for respondent engagement and demonstrate whether data collection via smartphone app is a viable method for repeated measures of hard-to-reach younger cohorts.
Abstract #2:
This research explores the utility of Facebook applications as survey and passive data collection platforms. Facebook applications are interactive, user-facing tools that enhance the user experience through social games, quizzes, and other interactive and social features. Facebook’s Graph Application Programming Interface (API) offers researchers the tools necessary to develop applications that can access both public and private data (provided access is permitted), which in turn offers data collection capabilities survey researchers are only now beginning to understand. Traditional survey questionnaires constrain not only the type of data that can be collected, but also the volume, accuracy, and timeliness of the data and the data collection process. Facebook’s Graph API is revolutionizing the way we conceptualize the word “data,” which has major implications across a variety of dimensions directly related to the field of survey research. For instance, Facebook offers the capability to stream data in real time, drawing from a user base of over 800 million, in forms that are both new (e.g., location check-ins, social networks, and status updates) and old (e.g., demographic data). Facebook applications offer researchers an opportunity to develop unique approaches to address research questions. These applications also provide a platform for questionnaire administration, data creation, and passive data collection in real time. This paper explores the applicability of Facebook applications, such as social gaming and user-experience-enhancing applications, to survey research. Results from a pilot study that used a Facebook application to engage the social networks of military personnel to build registries will be used to explore the potential uses of such applications.
Specifically, this research aims to provide a better understanding of the new types of data being developed, Facebook applications as a mode of questionnaire administration and participant recruitment, implications for sample development, and limitations to be addressed going forward.
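As a concrete illustration of the kind of field-level access the abstract describes, the sketch below builds a Graph API request URL for user-granted fields. This is a hedged sketch, not the study’s implementation: the node name, field list, token, and API version shown are placeholders, and the permissions an app actually needs depend on Facebook’s policies for each field.

```python
from urllib.parse import urlencode

# Base URL is illustrative; Graph API versions change over time.
GRAPH_BASE = "https://graph.facebook.com"

def build_graph_request(node, fields, access_token):
    """Construct a Graph API URL requesting specific fields for a node.

    The app must hold user-granted permissions for each non-public field
    (the "provided access is permitted" condition in the abstract).
    """
    query = urlencode({"fields": ",".join(fields), "access_token": access_token})
    return f"{GRAPH_BASE}/{node}?{query}"

# Hypothetical request: basic demographics plus posts (which carry
# location check-ins and status updates) for the logged-in user.
url = build_graph_request("me", ["name", "birthday", "location", "posts"], "APP_TOKEN")
```

Keeping URL construction separate from the HTTP call makes it easy to log and audit exactly which private fields an application requests from each participant.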
Abstract #3:
The Randomized Response Technique (RRT) is used to encourage accurate responding to sensitive survey questions. When using the RRT, respondents are given two questions (one sensitive and the other nonsensitive with a known response distribution) and are instructed to answer one of them. The question to be answered is determined by the outcome of a random act with a known probability (e.g., a coin toss) that only the respondent sees. Researchers do not know which question each respondent answered, but they are able to calculate proportions for each response to the sensitive question. Though it is designed to reduce error, the RRT may actually increase measurement error if respondents implement it incorrectly. Evaluating the RRT is challenging because the outcome of its driving feature, the randomizer, is concealed from researchers. As a result, prior research has typically assumed that higher reporting of undesirable responses signals the RRT’s success. Eight RRT items were evaluated in a non-probability survey of 75 participants in the online virtual world Second Life (SL). Participants were randomly assigned to one of three modes: face-to-face interview in SL, voice chat interview in SL, or web. The randomizer across all modes was an interactive, three-dimensional virtual coin toss that was discreetly manipulated by the researchers in order to determine with near certainty whether participants followed the procedure. Only 67% of participants followed the procedure for every RRT item. The greatest rate of procedural noncompliance on an item was 13%. In a true application of the RRT, such noncompliance would result in greatly inflated estimates. There were no significant differences in RRT compliance by demographic characteristics or survey mode. Most participants indicated in debriefing questions that they enjoyed this method of answering questions, but their noncompliance is cause for additional skepticism about using the RRT.
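The proportion calculation the abstract alludes to can be sketched for the unrelated-question RRT design it describes. This is an illustrative sketch, not the study’s code; the function and variable names are ours, and the example numbers are invented.

```python
def rrt_estimate(yes_rate, p_sensitive, innocuous_yes_rate):
    """Recover the sensitive-question 'yes' proportion from observed answers.

    yes_rate           -- overall observed proportion of 'yes' responses
    p_sensitive        -- known probability the randomizer selects the
                          sensitive question (e.g., 0.5 for a coin toss)
    innocuous_yes_rate -- known 'yes' proportion for the nonsensitive question

    The observed rate decomposes as
        yes_rate = p * pi + (1 - p) * q
    so the sensitive prevalence is pi = (yes_rate - (1 - p) * q) / p.
    """
    p, q = p_sensitive, innocuous_yes_rate
    return (yes_rate - (1 - p) * q) / p

# Coin toss (p = 0.5), innocuous question with a known 40% 'yes' rate,
# and 35% 'yes' answers observed overall -> estimated prevalence 30%.
pi_hat = rrt_estimate(0.35, 0.5, 0.40)
```

The formula also makes the noncompliance problem concrete: if some respondents answer the sensitive question regardless of the coin outcome, the effective selection probability exceeds the assumed p, and applying the formula with the nominal p misstates the true prevalence.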
Web survey bibliography (4086)
- Displaying Videos in Web Surveys: Implications for Complete Viewing and Survey Responses; 2017; Mendelson, J.; Lee Gibson, J.; Romano Bergstrom, J. C.
- Using experts’ consensus (the Delphi method) to evaluate weighting techniques in web surveys not...; 2017; Toepoel, V.; Emerson, H.
- Mind the Mode: Differences in Paper vs. Web-Based Survey Modes Among Women With Cancer; 2017; Hagan, T. L.; Belcher, S. M.; Donovan, H. S.
- Answering Without Reading: IMCs and Strong Satisficing in Online Surveys; 2017; Anduiza, E.; Galais, C.
- Ideal and maximum length for a web survey; 2017; Revilla, M.; Ochoa, C.
- Social desirability bias in self-reported well-being measures: evidence from an online survey; 2017; Caputo, A.
- Web-Based Survey Methodology; 2017; Wright, K. B.
- Handbook of Research Methods in Health Social Sciences; 2017; Liamputtong, P.
- Lessons from recruitment to an internet based survey for Degenerative Cervical Myelopathy: merits of...; 2017; Davies, B.; Kotter, M. R.
- Web Survey Gamification - Increasing Data Quality in Web Surveys by Using Game Design Elements; 2017; Schacht, S.; Keusch, F.; Bergmann, N.; Morana, S.
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Comparability of web and telephone surveys for the measurement of subjective well-being; 2017; Sarracino, F.; Riillo, C. F. A.; Mikucka, M.
- Achieving Strong Privacy in Online Survey; 2017; Zhou, Yo.; Zhou, Yi.; Chen, S.; Wu, S. S.
- A Meta-Analysis of the Effects of Incentives on Response Rate in Online Survey Studies; 2017; Mohammad Asire, A.
- Telephone versus Online Survey Modes for Election Studies: Comparing Canadian Public Opinion and Vote...; 2017; Breton, C.; Cutler, F.; Lachance, S.; Mierke-Zatwarnicki, A.
- Examining Factors Impacting Online Survey Response Rates in Educational Research: Perceptions of Graduate...; 2017; Saleh, A.; Bista, K.
- Usability Testing for Survey Research; 2017; Geisen, E.; Romano Bergstrom, J. C.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico.; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E.; Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS): A New Methodology; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- Rates, Delays, and Completeness of General Practitioners’ Responses to a Postal Versus Web-Based...; 2017; Sebo, P.; Maisonneuve, H.; Cerutti, B.; Pascal Fournier, J.; Haller, D. M.
- Necessary but Insufficient: Why Measurement Invariance Tests Need Online Probing as a Complementary...; 2017; Meitinger, K.
- Nonresponse in Organizational Surveying: Attitudinal Distribution Form and Conditional Response Probabilities...; 2017; Kulas, J. T.; Robinson, D. H.; Kellar, D. Z.; Smith, J. A.
- Theory and Practice in Nonprobability Surveys: Parallels between Causal Inference and Survey Inference...; 2017; Mercer, A. W.; Kreuter, F.; Keeter, S.; Stuart, E. A.
- Is There a Future for Surveys?; 2017; Miller, P. V.
- Reducing speeding in web surveys by providing immediate feedback; 2017; Conrad, F.; Tourangeau, R.; Couper, M. P.; Zhang, C.
- Social Desirability and Undesirability Effects on Survey Response Latencies; 2017; Andersen, H.; Mayerl, J.
- A Working Example of How to Use Artificial Intelligence To Automate and Transform Surveys Into Customer...; 2017; Neve, S.
- A Case Study on Evaluating the Relevance of Some Rules for Writing Requirements through an Online Survey...; 2017; Warnier, M.; Condamines, A.
- Estimating the Impact of Measurement Differences Introduced by Efforts to Reach a Balanced Response...; 2017; Kappelhof, J. W. S.; De Leeuw, E. D.
- Targeted letters: Effects on sample composition and item non-response; 2017; Bianchi, A.; Biffignandi, S.